Faster Ridge Regression via the Subsampled Randomized Hadamard Transform

Authors

  • Yichao Lu
  • Paramveer S. Dhillon
  • Dean P. Foster
  • Lyle H. Ungar
Abstract

We propose a fast algorithm for ridge regression when the number of features is much larger than the number of observations (p ≫ n). The standard way to solve ridge regression in this setting works in the dual space and gives a running time of O(n²p). Our algorithm, Subsampled Randomized Hadamard Transform-Dual Ridge Regression (SRHT-DRR), runs in time O(np log(n)) and works by preconditioning the design matrix with a Randomized Walsh-Hadamard Transform followed by subsampling of the features. We provide risk bounds for SRHT-DRR in the fixed design setting and show experimental results on synthetic and real datasets.
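The preconditioning the abstract describes — a random sign flip per feature, a Walsh-Hadamard transform, then feature subsampling — can be sketched in a few lines. Below is a minimal NumPy illustration, assuming the number of features p is a power of two; the function names `fwht` and `srht` are ours, not the paper's:

```python
import numpy as np

def fwht(a):
    """In-place unnormalized fast Walsh-Hadamard transform along axis 0.

    a.shape[0] must be a power of two; costs O(p log p) per column.
    """
    n = a.shape[0]
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            x = a[i:i + h].copy()
            y = a[i + h:i + 2 * h].copy()
            a[i:i + h] = x + y
            a[i + h:i + 2 * h] = x - y
        h *= 2
    return a

def srht(X, k, rng):
    """Sketch the p features of an (n, p) design matrix down to k via SRHT.

    Applies sqrt(1/k) * S H D to the feature dimension, where D is a random
    sign diagonal, H the (unnormalized) Walsh-Hadamard matrix, and S a uniform
    subsample of k coordinates. Requires p to be a power of two.
    """
    n, p = X.shape
    signs = rng.choice([-1.0, 1.0], size=p)     # D: random sign flip per feature
    Z = fwht((X * signs).T.copy())              # H D x for every observation
    idx = rng.choice(p, size=k, replace=False)  # S: keep k random coordinates
    return np.sqrt(1.0 / k) * Z[idx].T          # (n, k) sketched design matrix
```

With k = p the map is orthogonal up to scaling, so the Gram matrix XXᵀ used by dual ridge regression is preserved exactly; with k < p it is preserved only approximately, which is what the risk bounds quantify. Note this naive version pays O(np log p) for the full transform; the tighter O(np log n) in the abstract relies on a faster evaluation of only the sampled coordinates.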


Related articles

Improved Low-rank Matrix Decompositions via the Subsampled Randomized Hadamard Transform

We comment on two randomized algorithms for constructing low-rank matrix decompositions. Both algorithms employ the Subsampled Randomized Hadamard Transform [14]. The first algorithm appeared recently in [9]; here, we provide a novel analysis that significantly improves the approximation bound obtained in [9]. A preliminary version of the second algorithm appeared in [7]; here, we present a mil...


Improved matrix algorithms via the Subsampled Randomized Hadamard Transform

Several recent randomized linear algebra algorithms rely upon fast dimension reduction methods. A popular choice is the Subsampled Randomized Hadamard Transform (SRHT). In this article, we address the efficacy, in the Frobenius and spectral norms, of an SRHT-based low-rank matrix approximation technique introduced by Woolfe, Liberty, Rokhlin, and Tygert. We establish a slightly better Frobenius...


Improved Analysis of the subsampled Randomized Hadamard Transform

This paper presents an improved analysis of a structured dimension-reduction map called the subsampled randomized Hadamard transform. This argument demonstrates that the map preserves the Euclidean geometry of an entire subspace of vectors. The new proof is much simpler than previous approaches, and it offers—for the first time—optimal constants in the estimate on the number of dimensions requi...


RADAGRAD: Random Projections for Adaptive Stochastic Optimization

We present RADAGRAD, a simple and computationally efficient approximation to full-matrix ADAGRAD based on dimensionality reduction using the subsampled randomized Hadamard transform. RADAGRAD is able to capture correlations in the gradients and achieves a similar regret – in theory and empirically – to full-matrix ADAGRAD, but at a computational cost comparable to the diagonal variant.
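For contrast, the diagonal variant that full-matrix ADAGRAD (and hence RADAGRAD) is measured against keeps only per-coordinate second-moment statistics; a minimal sketch of that update (illustrative names, not RADAGRAD itself):

```python
import numpy as np

def adagrad_step(w, g, G, lr=0.1, eps=1e-8):
    """One step of diagonal AdaGrad: per-coordinate adaptive learning rates.

    G accumulates squared gradients coordinate-wise. Full-matrix ADAGRAD, which
    RADAGRAD approximates, would instead accumulate the outer products g g^T
    and whiten the gradient with the inverse square root of that matrix.
    """
    G += g * g                           # running sum of squared gradients
    w -= lr * g / (np.sqrt(G) + eps)     # per-coordinate scaled step
    return w, G
```

The diagonal update costs O(d) per step but ignores gradient correlations; the full-matrix version captures them at O(d²) or worse, which is the gap RADAGRAD's random projection closes.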


Fast Regression with an ℓ∞ Guarantee

Sketching has emerged as a powerful technique for speeding up problems in numerical linear algebra, such as regression. In the overconstrained regression problem, one is given an n × d matrix A, with n ≫ d, as well as an n × 1 vector b, and one wants to find a vector x̂ so as to minimize the residual error ‖Ax − b‖₂. Using the sketch-and-solve paradigm, one first computes S · A and S · b for a rand...
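The sketch-and-solve paradigm described above fits in a few lines; here is a minimal sketch using a dense Gaussian S in place of the faster structured sketches (such as the SRHT) that this literature uses, with an illustrative function name of our own:

```python
import numpy as np

def sketch_and_solve(A, b, m, rng):
    """Approximate least squares min_x ||Ax - b||_2 via sketch-and-solve.

    A: (n, d) with n >> d; b: (n,). A dense Gaussian sketch S of shape (m, n)
    compresses the problem; solving the m x d sketched problem is cheap when
    d <= m << n.
    """
    n, d = A.shape
    S = rng.standard_normal((m, n)) / np.sqrt(m)           # E[S^T S] = I
    x_hat, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)  # solve sketched LS
    return x_hat
```

With m large enough relative to d, the sketched solution's residual is within a small factor of the optimal one with high probability, which is the (1 + ε) guarantee the sketch-and-solve analyses establish.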




Publication date: 2013